New Notions and Mechanisms for Statistical Privacy
Author
Adam Groce
Abstract
Title of dissertation: NEW NOTIONS AND MECHANISMS FOR STATISTICAL PRIVACY
Adam Groce, Doctor of Philosophy, 2014
Dissertation directed by: Professor Jonathan Katz, Department of Computer Science

Many large databases of personal information currently exist in the hands of corporations, nonprofits, and governments. The data in these databases could be used to answer any number of important questions, aiding in everything from basic research to day-to-day corporate decision-making. These questions must be answered while respecting the privacy of the individuals whose data are being used. However, even defining privacy in this setting can be difficult. The standard definition in the field is differential privacy [25]. In the years since its introduction, a wide variety of query algorithms have been found that achieve meaningful utility while at the same time protecting the privacy of individuals.

However, differential privacy is a very strong definition, and in some settings it can seem too strong. Given the difficulties involved in obtaining differentially private output for all desirable queries, many have looked for ways to weaken differential privacy without losing its meaningful privacy guarantees. Here we discuss two such weakenings. The first is computational differential privacy, originally defined by Mironov et al. [56]. We find the promise of this weakening to be limited. We show two results that severely curtail the potential for computationally private mechanisms to add any utility over those that achieve standard differential privacy when working in the standard setting with all data held by a single entity.

We then propose our own weakening, coupled-worlds privacy. This definition is meant to capture the cases where reasonable bounds can be placed on the adversary's certainty about the data (or, equivalently, the adversary's auxiliary information). We discuss the motivation for the definition, its relationship to other definitions in the literature, and its useful properties. Coupled-worlds privacy is actually a framework with which specific definitions can be instantiated, and we discuss a particular instantiation, distributional differential privacy, which we believe is of particular interest.

Having introduced this definition, we then seek new distributionally differentially private query algorithms that can release useful information without the need to add noise, as is necessary when satisfying differential privacy. We show that one can release a variety of query outputs with distributional differential privacy, including histograms, sums, and least-squares regression lines.
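For context on the noise-adding baseline the abstract contrasts itself with, here is a minimal sketch of the standard Laplace mechanism for a sum query, the kind of statistic the dissertation later argues can be released without noise under distributional differential privacy. The function name, the range bound `B`, and the use of NumPy are illustrative assumptions, not taken from the dissertation.

```python
import numpy as np

def laplace_sum(values, B, epsilon):
    """Release an epsilon-differentially private sum of `values`,
    each assumed to lie in [0, B].

    Adding or removing one record changes the sum by at most B, so the
    global sensitivity is B and Laplace noise with scale B / epsilon
    suffices for epsilon-differential privacy.
    """
    true_sum = float(np.sum(values))
    noise = np.random.laplace(loc=0.0, scale=B / epsilon)
    return true_sum + noise

# Example: incomes capped at 100, privacy parameter epsilon = 0.5
# print(laplace_sum([23.0, 57.5, 88.1], B=100.0, epsilon=0.5))
```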
Similar references
An Axiomatic View of Statistical Privacy and Utility
“Privacy” and “utility” are words that frequently appear in the literature on statistical privacy. But what do these words really mean? In recent years, many problems with intuitive notions of privacy and utility have been uncovered. Thus more formal notions of privacy and utility, which are amenable to mathematical analysis, are needed. In this paper we present our initial work on an axiomatiz...
Dynamic Differential Location Privacy with Personalized Error Bounds
Location privacy has continued to attract significant attention in recent years, fueled by the rapid growth of location-based services (LBSs) and smart mobile devices. Location obfuscation has been the dominant location privacy preserving approach, transforming the exact location of a mobile user into a perturbed location before its public release. The notion of location privacy has evolved fro...
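The snippet above describes location obfuscation only in general terms. As one widely cited instantiation of the idea (the planar Laplace mechanism from the geo-indistinguishability literature, not the personalized mechanism this particular paper proposes), a perturbed location can be sampled as sketched below; treating coordinates as points on a flat plane is a simplifying assumption.

```python
import numpy as np
from scipy.special import lambertw

def planar_laplace(lat, lon, epsilon):
    """Perturb a location with the planar Laplace mechanism
    (geo-indistinguishability, Andres et al. 2013).

    `epsilon` is the privacy level per unit of distance; coordinates are
    treated as points on a flat plane, which is a simplification.
    """
    theta = np.random.uniform(0.0, 2.0 * np.pi)   # random direction
    p = np.random.uniform(0.0, 1.0)               # uniform draw for inverse-CDF sampling
    # The radial distance is sampled via the inverse CDF, which involves
    # the Lambert W function on its -1 branch.
    r = -(lambertw((p - 1.0) / np.e, k=-1).real + 1.0) / epsilon
    return lat + r * np.cos(theta), lon + r * np.sin(theta)
```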
Rényi Differential Privacy
We propose a natural relaxation of differential privacy based on the Rényi divergence. Closely related notions have appeared in several recent papers that analyzed composition of differentially private mechanisms. We argue that this useful analytical tool can be used as a privacy definition, compactly and accurately representing guarantees on the tails of the privacy loss. We demonstrate that th...
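For reference, the relaxation described in this abstract is usually stated as follows: a randomized mechanism $M$ satisfies $(\alpha, \varepsilon)$-Rényi differential privacy if, for every pair of adjacent databases $D$ and $D'$,

```latex
D_{\alpha}\bigl(M(D)\,\|\,M(D')\bigr)
  = \frac{1}{\alpha - 1}\,
    \log \mathbb{E}_{x \sim M(D')}\!\left[
      \left(\frac{\Pr[M(D)=x]}{\Pr[M(D')=x]}\right)^{\alpha}
    \right]
  \le \varepsilon .
```

Taking $\alpha \to \infty$ recovers pure $\varepsilon$-differential privacy.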
Novel differentially private mechanisms for graphs
In this paper, we introduce new methods for releasing differentially private graphs. Our techniques are based on a new way to distribute noise among edge weights. More precisely, we rely on the addition of noise whose amplitude is edge-calibrated and optimize the distribution of the privacy budget among subsets of edges. The generic privacy framework that we propose can capture all privacy not...
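The abstract only sketches this approach. As a rough illustration of the general idea of edge-calibrated noise (assuming a given per-edge budget split and sensitivity 1 per edge weight, not the optimization the authors actually perform), edge weights could be perturbed as follows.

```python
import numpy as np

def noisy_edge_weights(weights, budgets):
    """Add Laplace noise to each edge weight, calibrated by a per-edge budget.

    `weights` maps an edge (u, v) to its weight; `budgets` maps the same edge
    to the share of the privacy budget spent on it. With sensitivity 1 per
    edge weight, noise of scale 1 / budget gives that edge its budgeted epsilon.
    """
    return {
        edge: w + np.random.laplace(loc=0.0, scale=1.0 / budgets[edge])
        for edge, w in weights.items()
    }

# Example: spend more budget (hence less noise) on a "heavy" edge.
# weights = {("a", "b"): 5.0, ("b", "c"): 1.0}
# budgets = {("a", "b"): 0.4, ("b", "c"): 0.1}
# print(noisy_edge_weights(weights, budgets))
```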
Closing the Gap: A Universal Privacy Framework for Outsourced Data
We study formal privacy notions for data outsourcing schemes. The aim of our efforts is to define a security framework that is applicable to highly elaborate as well as practical constructions. First, we define the privacy objectives: data privacy, query privacy, and result privacy. We then investigate fundamental relations among them. Second, to make them applicable to practical constructions, ...